Lower bounds for integration and recovery in L2
Authors
Abstract
Function values are, in some sense, "almost as good" as general linear information for L2-approximation (optimal recovery, data assimilation) of functions from a reproducing kernel Hilbert space. This was recently proved by new upper bounds on the sampling numbers, under the assumption that the singular values of the embedding of this space into L2 are square-summable. Here we mainly prove lower bounds. In particular, we prove that sampling numbers behave worse than approximation numbers for Sobolev spaces with small smoothness. Hence there can be a logarithmic gap also in the case where the singular numbers of the embedding are square-summable. We first study the integration problem, again for rather classical Sobolev spaces of periodic univariate functions.
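For orientation (our notation, not quoted from the paper): the comparison is between approximation numbers, which allow arbitrary linear functionals as information, and sampling numbers, which allow only point evaluations. For the embedding of a reproducing kernel Hilbert space H into L2, a common convention is

```latex
% approximation numbers: best worst-case L2 error using any n linear functionals
a_n = \inf_{L_1,\dots,L_n \colon H \to \mathbb{R}} \ \inf_{\varphi} \ \sup_{\|f\|_H \le 1}
      \bigl\| f - \varphi\bigl(L_1(f), \dots, L_n(f)\bigr) \bigr\|_{L_2},
% sampling numbers: the same, but only function values f(x_i) may be used
g_n = \inf_{x_1,\dots,x_n} \ \inf_{\varphi} \ \sup_{\|f\|_H \le 1}
      \bigl\| f - \varphi\bigl(f(x_1), \dots, f(x_n)\bigr) \bigr\|_{L_2},
```

so a_n ≤ g_n always, and the "logarithmic gap" in the abstract refers to g_n exceeding a_n by a logarithmic factor for Sobolev spaces of small smoothness.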
Similar resources
On lower bounds for the L2-discrepancy
The L2-discrepancy measures the irregularity of the distribution of a finite point set. In this note, we prove lower bounds for the L2-discrepancy of arbitrary N-point sets. Our main focus is on the two-dimensional case. Asymptotic upper and lower estimates of the L2-discrepancy in dimension 2 are well known and are of the sharp order √(log N). Nevertheless, the gap in the constants between the...
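The quantity studied here can be evaluated exactly for any concrete point set via Warnock's formula for the L2 star discrepancy. The sketch below (plain NumPy; function name is ours, and it is an illustration, not part of the note) computes it in O(N²d) time.

```python
import numpy as np

def l2_star_discrepancy(points):
    """Exact L2 star discrepancy of an N-point set in [0,1]^d via Warnock's formula."""
    x = np.asarray(points, dtype=float)  # shape (N, d)
    n, d = x.shape
    term1 = 3.0 ** (-d)
    # single sum over points: product over coordinates of (1 - x_ik^2)/2
    term2 = (2.0 / n) * np.sum(np.prod((1.0 - x ** 2) / 2.0, axis=1))
    # double sum over point pairs: product over coordinates of 1 - max(x_ik, x_jk)
    pair_max = np.maximum(x[:, None, :], x[None, :, :])
    term3 = np.sum(np.prod(1.0 - pair_max, axis=2)) / n ** 2
    return np.sqrt(term1 - term2 + term3)
```

As a sanity check, for a single point at 1/2 in dimension 1 the squared discrepancy is 1/3 + x² − x = 1/12, matching a direct integration of the local discrepancy.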
Algorithms and Lower Bounds for Sparse Recovery
We consider the following k-sparse recovery problem: design a distribution over m × n matrices A such that for any signal x, given Ax, with high probability we can efficiently recover x̂ satisfying ‖x − x̂‖1 ≤ C · min_{k-sparse x′} ‖x − x′‖1. It is known that there exist such distributions with m = O(k log(n/k)) rows; in this thesis, we show that this bound is tight. We also introduce the set query algorithm,...
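The recovery task itself can be illustrated with a textbook greedy decoder, orthogonal matching pursuit; this is not the thesis's construction, and all names below are ours. Given measurements b = Ax of a k-sparse x, it greedily selects columns of A and refits by least squares.

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal matching pursuit: greedy k-sparse least-squares fit to A x = b."""
    support = []
    residual = b.astype(float)
    coef = np.zeros(0)
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # refit the coefficients on the selected columns
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat
```

With orthonormal columns, recovery of an exactly k-sparse signal is exact; for random m × n measurement matrices, exact recovery holds with high probability once m is on the order of k log(n/k), the regime whose tightness the thesis addresses.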
Upper and lower bounds for numerical radii of block shifts
For an n-by-n complex matrix A in a block form with the (possibly) nonzero blocks only on the diagonal above the main one, we consider two other matrices whose nonzero entries are along the diagonal above the main one and consist of the norms or minimum moduli of the diagonal blocks of A. In this paper, we obtain two inequalities relating the numerical radii of these matrices and also determine ...
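For intuition, the numerical radius w(A) = max_{‖z‖=1} |z*Az| can be approximated numerically via the identity w(A) = max_θ λ_max((e^{iθ}A + e^{−iθ}A*)/2). The short sketch below is ours, not from the paper, and uses a simple grid over θ.

```python
import numpy as np

def numerical_radius(A, grid=720):
    """Approximate w(A) as the max over theta of the largest eigenvalue
    of the Hermitian part of e^{i theta} A."""
    A = np.asarray(A, dtype=complex)
    w = 0.0
    for theta in np.linspace(0.0, 2.0 * np.pi, grid, endpoint=False):
        H = (np.exp(1j * theta) * A + np.exp(-1j * theta) * A.conj().T) / 2.0
        w = max(w, np.linalg.eigvalsh(H).max())
    return w
```

For the 2-by-2 nilpotent Jordan block (a single superdiagonal entry 1), every H(θ) has eigenvalues ±1/2, so the grid approximation returns w = 1/2 exactly, the classical value for this shift.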
Lower Bounds for Adaptive Sparse Recovery
We give lower bounds for the problem of stable sparse recovery from adaptive linear measurements. In this problem, one would like to estimate a vector x ∈ R^n from m linear measurements A1x, . . . , Amx. One may choose each vector Ai based on A1x, . . . , Ai−1x, and must output x̂ satisfying ‖x̂ − x‖p ≤ (1 + ε) min_{k-sparse x′} ‖x − x′‖p with probability at least 1 − δ > 2/3, for some p ∈ {1, 2}. For p = ...
Journal
Journal title: Journal of Complexity
Year: 2022
ISSN: 1090-2708, 0885-064X
DOI: https://doi.org/10.1016/j.jco.2022.101662